Forward- or Reverse-Mode Automatic Differentiation: What's the Difference?
Authors
Abstract
Automatic differentiation (AD) has been a topic of interest for researchers in many disciplines, with increased popularity since its application to machine learning and neural networks. Although many appreciate and know how to apply AD, it remains a challenge to truly understand the underlying processes. From an algebraic point of view, however, AD appears surprisingly natural: it originates from the differentiation laws. In this work we use Algebra of Programming techniques to reason about different AD variants, leveraging Haskell to illustrate our observations. Our findings stem from three fundamental algebraic abstractions: (1) the notion of a semimodule, (2) Nagata's construction of the ‘idealization of a module’, and (3) Kronecker's delta function, which together allow us to write a single-line abstract definition of AD. From this definition, by instantiating the algebraic structures in various ways, we derive variants that have the same extensional behaviour but different intensional properties, mainly in terms of (asymptotic) computational complexity. We show the variants equivalent by means of Kronecker isomorphisms, a further elaboration of our infrastructure which guarantees correctness by construction. With this framework in place, the paper seeks to make AD more comprehensible, taking an algebraic perspective on the matter.
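The abstract's ingredients can be made concrete with the textbook dual-number instance of forward-mode AD in Haskell. The sketch below is our own illustration, not the paper's code: pairing a value with its derivative mirrors Nagata's idealization of a module, and seeding the tangent with 1 plays the role of Kronecker's delta.

```haskell
-- Forward-mode AD via dual numbers, a concrete instance of Nagata's
-- idealization N ⋉ M: each number carries a primal value and a tangent.
-- (Minimal sketch with our own names; not the paper's infrastructure.)
data Dual = Dual { primal :: Double, tangent :: Double }
  deriving Show

instance Num Dual where
  Dual x dx + Dual y dy = Dual (x + y) (dx + dy)
  Dual x dx * Dual y dy = Dual (x * y) (x * dy + dx * y)  -- Leibniz rule
  negate (Dual x dx)    = Dual (negate x) (negate dx)
  fromInteger n         = Dual (fromInteger n) 0          -- constants have zero tangent
  abs    = error "abs: not needed for this sketch"
  signum = error "signum: not needed for this sketch"

-- Differentiate f at x by seeding the tangent with 1 (Kronecker's delta).
diff :: (Dual -> Dual) -> Double -> Double
diff f x = tangent (f (Dual x 1))

-- Example: d/dx (x^2 + 3x) at x = 2 is 2*2 + 3 = 7.
main :: IO ()
main = print (diff (\x -> x*x + 3*x) 2)  -- prints 7.0
```

Because `Dual` is a `Num` instance, any AD-unaware polymorphic function of type `Num a => a -> a` can be differentiated this way without modification.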
Similar references
Forward-Mode Automatic Differentiation in Julia
We present ForwardDiff, a Julia package for forward-mode automatic differentiation (AD) featuring performance competitive with low-level languages like C++. Unlike recently developed AD tools in other popular high-level languages such as Python and MATLAB, ForwardDiff takes advantage of just-in-time (JIT) compilation to transparently recompile AD-unaware user code, enabling efficient support fo...
The Stan Math Library: Reverse-Mode Automatic Differentiation in C++
As computational challenges in optimization and statistical inference grow ever harder, algorithms that utilize derivatives are becoming increasingly more important. The implementation of the derivatives that make these algorithms so powerful, however, is a substantial user burden and the practicality of these algorithms depends critically on tools like automatic differentiation that remove the...
Enabling user-driven Checkpointing strategies in Reverse-mode Automatic Differentiation
Abstract. This paper presents a new functionality of the Automatic Differentiation (AD) tool tapenade. tapenade generates adjoint codes which are widely used for optimization or inverse problems. Unfortunately, for large applications the adjoint code demands a great deal of memory, because it needs to store a large set of intermediate values. To cope with that problem, tapenade implements a su...
"To be recorded" analysis in reverse-mode automatic differentiation
The automatic generation of adjoints of mathematical models that are implemented as computer programs is receiving increased attention in the scientific and engineering communities. Reverse-mode automatic differentiation is of particular interest for large-scale optimization problems. It allows the computation of gradients at a small constant multiple of the cost for evaluating the objective fu...
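The "small constant multiple" cost of reverse mode can be illustrated with a single-variable sketch in Haskell (our own illustrative code, unrelated to the tool under discussion): the adjoint is seeded with 1 at the output and propagated back through the expression, accumulating each path's contribution to the derivative. For clarity this version re-evaluates subtrees; real implementations record intermediate values on a tape precisely so that the backward sweep stays a constant multiple of the forward cost.

```haskell
-- A tiny expression language over one variable.
data Expr = Var | Const Double | Add Expr Expr | Mul Expr Expr

-- Forward sweep: evaluate at a point x.
eval :: Double -> Expr -> Double
eval x Var       = x
eval _ (Const c) = c
eval x (Add a b) = eval x a + eval x b
eval x (Mul a b) = eval x a * eval x b

-- Backward sweep: push the adjoint (seeded with 1 at the root) down the
-- tree, summing every path's contribution to d(output)/d(Var).
grad :: Double -> Expr -> Double
grad x e = go e 1
  where
    go Var       adj = adj
    go (Const _) _   = 0
    go (Add a b) adj = go a adj + go b adj
    go (Mul a b) adj = go a (adj * eval x b) + go b (adj * eval x a)

-- Example: f(x) = x^2 + 3x, so f'(2) = 2*2 + 3 = 7.
main :: IO ()
main = print (grad 2 (Add (Mul Var Var) (Mul (Const 3) Var)))  -- prints 7.0
```

With many inputs the same backward sweep delivers the whole gradient in one pass, which is why reverse mode dominates for large-scale optimization.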
Who Invented the Reverse Mode of Differentiation?
Nick Trefethen [13] listed automatic differentiation as one of the 30 great numerical algorithms of the last century. He kindly credited the present author with facilitating the rebirth of the key idea, namely the reverse mode. In fact, there have been many incarnations of this reversal technique, which has been suggested by several people from various fields since the late 1960s, if not earlie...
Journal
Journal title: Science of Computer Programming
Year: 2023
ISSN: 1872-7964, 0167-6423
DOI: https://doi.org/10.1016/j.scico.2023.103010